For the twelfth bug of Christmas, my manager said to me:
"Tell them it's a feature
Say it's not supported
Change the documentation
Blame it on the hardware
Find a way around it
Say they need an upgrade
Reinstall the software
Ask for a dump
Run with the debugger
Try to reproduce it
Ask them how they did it and
See if they can do it again."
—The Twelve Bugs of Christmas - Anon.
How many lines of that chorus have you heard before? How many of those have been in the context of lame excuses for a late or faulty project? This chapter is intended to help you avoid being in that embarrassing position.
There are many excellent books available dedicated to the entire philosophy of software development—for example, Code Complete by Steve McConnell (Microsoft Press). I strongly encourage you to obtain and study books of this kind, as they will help you to get things right the first time and save a lot of last-minute grief. It goes without saying that you should also read as much as you can of the Visual Basic 4.0 manuals before you start coding!
While it's true that Visual Basic 4.0 provides you with a wealth of tools and facilities to trace bugs and eliminate them, you should clearly note that these tools only have particular relevance to the Coding and Testing stages of the development
process. It is important that all the other aspects should also be attended to with equal intensity. Even the best debugging tools in the world are useless in the face of convoluted "spaghetti" code.
So, assuming you've read the debugging part of the Visual Basic 4.0 manuals, I'll complement it by spending the first few pages of this chapter on a whistle-stop tour of some of the concepts that can save you time in the long run if applied up-front,
and the rest in a discussion of generic faults that you can look for in any Visual Basic program.
Simply speaking, project development may be split into three stages in which defects may be found and reported. The stages are as follows:

1. Specification
2. Design, coding, and testing
3. Live (production) running
Stages 1 and 3 usually rely on interactions between the customer and business analysts, often without any particular reference to yourself, the humble programmer. Visual Basic 4.0's debugging tools can only really help you with the later parts of Stage
2. Any significant fault that gets as far as the third stage should be treated as an indicator of critical failure of development procedure.
For the purposes of this chapter, we will assume that the specification stage has been carried out promptly and to the highest possible standard. While this is rarely true in practice, the first stage that you will probably be able to influence is the
Program Design stage—to optimize your progress, you should think about debugging and testing at this stage—not at the end of the project.
One of the major enhancements in Visual Basic 4.0 is the introduction of class modules, which enable you to define your own objects. I can't emphasize enough how important it is to make sure the design and definitions of these objects are correct and fixed before you start coding around them. This is even more important if you are working in a team environment, where one person could change a class module without the other team members finding out until much later, after they have spent ages trying to understand why their previously tested code doesn't work properly any more.
The testing of each component should be aimed at causing it to fail. In addition to testing the functionality, the product should be subject to invalid input, be stressed so as to check its management of resources, and be surrounded by barriers that
detect any attempt to use objects that it was not intended to access. A certain amount of this barrier functionality should also be placed within the application itself, subject to performance considerations. Be aware that removing debug code from your
application is effectively creating a new version of the software—which will then need to be retested from first principles.
In practice, you will find that no substantial piece of software is ever error-free. You will also (almost invariably) find that the sooner a defect is found, the easier and cheaper it is to fix. Worst of all, regular discovery of errors in production systems is invariably detrimental to your relationship with your customers.
On the other hand, an efficient testing regime speeds development by increasing the rate at which components can be successfully integrated into the system. Delivery of a quality product within the necessary time scale is crucial to the commercial
success of the project.
However, the recent pervasiveness of the Internet (along with other networks and bulletin boards) has often tempted vendors to ship a less-than-perfect product, on the basis that cheap fixes can be rapidly issued via modem. This is not a system that a quality vendor will apply as a matter of course, because of the inability to control the application of the fix (assuming the user is even aware of its existence) and the impression the user will get that the vendor's software is always seriously faulty until at least the first service pack has been applied.
An important technique in Program Design is to encapsulate sections of the code into self-contained units whose purpose is sufficiently simple and definable that they may be rigorously tested and proved in isolation from other external considerations.
Visual Basic's event-driven model and support for modular design has always guided the programmer toward this in the past—Visual Basic 4.0 extends these facilities.
It should be clear that global variables are anathema to good program design because they are effectively undocumented and invisible parameters. Global constants are not quite so bad, as long as their values do in fact remain constant throughout the life of the project; any variation from this rule must be seen as a prime opportunity for errors to arise. Another potential source of errors arises when a name could equally imply a variable or a constant. For example, is EndOfList a defined constant or a true/false Boolean? You can get around the need for global constants in Visual Basic 4.0 by using a type library (a .TLB file)—add a reference to the typelib and you instantly have access to an object's methods and properties, including constants. You don't carry any extra baggage in your .EXE—you only compile in what you use. Be aware of the implications for the system registry—see "Creating Object Applications" in Professional Features Book 1.
If well-designed, the components may also become reusable in later projects—which is the philosophy behind the .VBX (and now .OCX) components that have made Visual Basic such a successful development tool.
One of the major advances of Visual Basic 4.0 over previous versions of Visual Basic is the ability to encapsulate and include your code within libraries of reusable OLE objects. You should take every opportunity to use this facility offered by class
modules. For further information, read the Visual Basic 4.0 manuals and the relevant chapters of this book. For example, consider such things as About or Password dialogs. Once you have created and tested a generic version of each of these, why should you
ever have to bother doing another one? Reuse of such components also helps different projects conform to corporate look-and-feel guidelines, which is important from the perspective of reducing training costs when users are transferred between departments.
Removal of "unreachable code" should be a regular process for developers. There are excellent tools available for this process such as VBCompress by WhippleWare—check for the availability of a Visual Basic 4.0 version. In practice,
however, you may wish to make exceptions for fully tested modules containing a set of standard, general-purpose functions where it might be less disruptive to retain the original version (instead of hacking different bits off to suit different projects
that might share that module).
Testing of completed components falls into two main areas:
Bug tracing naturally lags behind program coding. It follows that anything you can do to minimize this time lag will tend to reduce the opportunity for the defect to disrupt you and delay your application. It's therefore essential to devise and produce
a programmatic test-harness at the same time the component is designed and written. If it is not done concurrently as an integral part of the development process, it will probably never get done. A test-harness can also give you some information about the
component's speed without other influences confusing the results.
There are two further benefits from early allocation of resources for testing needs. First, the resource requirement for the test group is flattened out, so it will not have to cope with a sudden influx of work. Second, as soon as the tests are in place, the developers will have an extra resource for debugging their code. Therefore, produce test programs to automatically exercise all the methods and properties exposed by the objects as part of the build process itself. Don't sign any component off
or release it for public use until it has passed the full set of tests. If a change is made to common code, then all other components using that code must also be subjected to the same procedure.
Once the component has passed the tests it must be allocated a version number that can be read at run-time by any programs using it. From that sign-off point forward, this version of the component is effectively cast in stone and the source code for
that version should be readily retrievable from a Version Control archive system. A major feature of Visual Basic 4.0 is its capability to accommodate add-ins, one of which enables links to Version Control systems. As Microsoft has decided to push the
SourceSafe product, you can be sure that it will play a large part in their plans—perhaps as a basis for an object repository? In the meantime, consider whether you might be better off with a VCS that supports multiple hardware platforms.
The addition of resource files and version information is a further major improvement in Visual Basic 4.0. Previous versions required use of Borland's Resource Workshop or Desaware's VersionStamper to retrospectively add version numbers to Visual
Basic-produced .EXE's (C++ components are versioned in their .RC file at final build). See Microsoft's Knowledge Base article number Q107992 for any VB3 versioning needs you may have. This is available on the Developer Network CDs or via GO MSKB on
CompuServe.
If a bug is found, an extra set of tests must be added immediately to the test-harness to verify the corrected component, which then must pass the entire enlarged suite of tests before it is allocated a new version number and signed off. When the time
for system issue comes around, you can then define a set of approved component versions that have been tested together for that release.
To determine whether an approved combination of components is being run, your application must then be coded with a function call to interrogate all the components at start-up. If another application has been installed since this one, overwriting one or
more of your set of components, you can then display a warning to that effect and (if required) prevent the application from continuing until the correct versions are replaced. This procedure may save you hours of support calls that aren't helped by the
fact that users cannot be seen as a reliable source of information about the exact nature of a fault. You can't control what other software may be running or what hardware it is running on—but that's no excuse for failing to control the aspects that
are under your jurisdiction.
System Testing is when all the fully tested and signed-off components have been assembled into a larger whole and the correctness of their interactions is verified. If possible, it is better to create subassemblies of components (which can have their
own test-harness) for subsystem testing rather than throwing everything together at once. To use a biological analogy, test the branches individually before testing the whole tree.
Regression Testing is when a complete program is systematically retested after each build in order to verify that its behavior only differs from the previous build in the desired manner, and no other. It is invaluable for detecting complex defects
missed by unit level tests, in that an unpredicted change in output is often an indication of error.
The only practical way to regression-test properly is to use one of the software tools that can automate the running of your application. Such a tool should be able to save the states and contents of all the objects on the current form at each stage of the run (Record mode), and then compare this set of master states with later builds (Playback mode), so that unintended changes manifesting themselves as user-interface differences are picked up immediately.
For example, here's an analysis of a simple Visual Basic 3.0 project's object states and menu states as evaluated by SQA TeamTest v3.0—a high-end and, compared to MS Test, fairly expensive tool. It is not clear at the time of writing (June 1995)
whether TeamTest will be available for 32-bit Visual Basic 4.0 developments under Win95/NT operating systems, but this should give you an idea of what may be achieved. See Figures 13.1 and 13.2.
It's useful to get some statistics on what percentage of your system is actually being exercised by such tests—you may be surprised at how much of your system is hardly ever used.
Figure 13.1. An example of a regression test tool.
Figure 13.2. An example of a regression test tool.
Programmers have a duty to issue components for testing only when they consider them to be entirely correct—as long as time pressures permit. However, a common misconception is that testing is somehow separate from the main process of development.
This usually results in the scheduling of testing requirements not being considered as important as the scheduling of programming tasks, which leads to time and resources for testing being squeezed into a frantic last-minute panic, and any chance of
efficiency and thoroughness is squandered.
If the testing is relegated to the end of the product development cycle, any time spent testing must compete with the commercial need to ship the product. The inevitable result is a product being shipped later, and with a lower level of reliability and
perceived quality than could have been achieved. There is also a risk of the software being tested in too narrow a set of conditions with the effect that in highly diverse environments it may not be sufficiently resilient.
Clearly, some sequences in which testing takes place will be more efficient than others. Some components may require other components before they can do anything useful, so a critical path analysis proves useful. A schedule of programming tasks should be devised with this in mind and in consultation with the testing personnel. This critical path analysis helps you optimize your testing strategy by showing which portions of the project can be tested concurrently, and the extent of any rollback required in the testing procedure should a bug be discovered in previously signed-off components. It also aids you in your regular reports to your superiors.
It is therefore vital that the testing stage reached by any element of the system can be identified at any time. The validity of dependent tests will be uncertain if this is not the case.
A common problem with the hiring of testing personnel or assigning testing tasks is that testing in particular is often regarded as a menial and tedious task. In fact, it is a specialized task, requiring a good technical knowledge and a thorough
understanding of the system design. Without such knowledge it is impossible to devise meaningful and efficient tests; therefore, it is essential that personnel with appropriate capabilities are made available for the whole process.
Given sufficient abilities, it is often productive for testers to have access to the source code to enable them to "rig" the code with their own monitoring and analysis tools—that is, to add temporary code to yield more information about
the program's operation. However, this may be a source of political friction in some workplaces.
Nevertheless, there is nothing like the threat of public review to inspire an immediate improvement in code quality. Go for it! Find a secure area and pin regularly updated printouts of all the project's source code on the wall for the whole development
team to add their comments to! You may also have an opinion on the value of some kind of reward scheme for consistent low-error coding.
When undertaking code reviews, it is important to prevent the discussion from degenerating into an argument. A publicized checklist of both preferred and unacceptable coding practices can help prevent this as well as reduce the likelihood of the faults
occurring in the first place. If you're a manager, you might like to set up a spreadsheet with types of errors given a weighting (according to severity) and multiplied by the number of occurrences. You can then derive a "Code Quality Index"
(whose value should tend to zero over time) for each code module once it has stabilized. Whether such practices are cost- and time-efficient depends on how mission-critical your software is.
It's possible to define the characteristics of a piece of code that are necessarily incorrect, regardless of any functional specification that the code may represent.
For example, consider the following dialog box with absolutely no code behind it (see Figure 13.3).
Figure 13.3. A dialog box with no code behind it that has an error.
Can you see what's wrong? Yes—there's a clash of accelerator keys!
The first button has its first "m" underlined, the second button has its second "m" underlined. This program is inherently defective (unless, of course, you're trying to demonstrate hot-key clashes!).
Multiple use of accelerator assignments is not incorrect when the objects involved are always mutually invisible. For example, multiple assignments are acceptable within different menus or tab dialogs. Nevertheless, it is best to detect the possibility
of a problem and discount it later instead of ignoring the possibility of error altogether.
Detection of these kinds of errors is essentially an extension of the process of syntax checking. The C language has long had tools, Lint for example, to inspect the code for generic faults. Raising the compilation Warning Level to 4 in Visual C++ does
much the same job.
As such tools are currently not available for Visual Basic, you must consider writing a few programs of your own, although it may not be thought an effective use of time to go too far down this road. You should prioritize your available time and
resources accordingly.
By way of example, here's a simple method to find accelerator key clashes. If you've never saved a VB form in ASCII format and inspected it with Notepad, do it now! If you re-create the preceding dialog, you get something like this:

VERSION 4.00
Begin VB.Form Form1
   Caption         =   "Form1"
   ClientHeight    =   735
   ClientLeft      =   2970
   ClientTop       =   1980
   ClientWidth     =   2775
   Height          =   1080
   Left            =   2940
   LinkTopic       =   "Form1"
   ScaleHeight     =   735
   ScaleWidth      =   2775
   Top             =   1665
   Width           =   2835
   Begin VB.CommandButton Command2
      Caption         =   "Com&mand2"
      Height          =   495
      Left            =   1440
      TabIndex        =   1
      Top             =   120
      Width           =   1215
   End
   Begin VB.CommandButton Command1
      Caption         =   "Co&mmand1"
      Height          =   495
      Left            =   120
      TabIndex        =   0
      Top             =   120
      Width           =   1215
   End
End
Attribute VB_Name = "Form1"
Attribute VB_Creatable = False
Attribute VB_Exposed = False
You can read a lot more about the Visual Basic file format in the Programmer's Guide manual. The essential rule of thumb to remember is that a property only appears in the file when its value differs from the default.
The lines you are interested in are these:
Caption         =   "Com&mand2"
Caption         =   "Co&mmand1"
I hope you agree that detection of multiple occurrences of characters preceded with & (except the ampersand character itself!) within lines that begin with
Caption = "
is not a desperately difficult task—the sort of thing that could be done in MS-DOS QBasic on a 1981 twin-floppy PC with 512K RAM! Not a Windows API in sight!
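Detecting the clash programmatically really is that simple. Here is a sketch of the idea in Python rather than the QBasic just mentioned (the logic translates directly to any language); the function name and sample lines are my own illustration, not part of any shipped tool:

```python
import re

def find_accelerator_clashes(form_lines):
    """Return the set of accelerator characters assigned more than once.

    Scans Caption properties in an ASCII-saved .FRM file. '&&' means a
    literal ampersand in a caption, so it is stripped before scanning.
    """
    seen, clashes = set(), set()
    for line in form_lines:
        if not line.strip().startswith("Caption"):
            continue
        for match in re.finditer(r"&(.)", line.replace("&&", "")):
            key = match.group(1).lower()  # accelerators are case-insensitive
            if key in seen:
                clashes.add(key)
            seen.add(key)
    return clashes

# The two captions from the faulty dialog above:
frm = [
    'Caption         =   "Com&mand2"',
    'Caption         =   "Co&mmand1"',
]
print(find_accelerator_clashes(frm))  # {'m'}
```

A fuller version would track which form or menu each caption belongs to, so that legitimately invisible-to-each-other objects are not reported.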
There are many different faults (or at least, sub-optimal practices) that can be found by programmatic examination of the source code.
Here are 50 that I have thought of (in no particular order). See if you can think of any more. Note that some are more applicable to Windows 3.1 than Windows 95. Because it is almost certain that our applications will still have to run on Windows 3.1 systems for the foreseeable future (whether native 16-bit or via Win32s), you should take that into account.
Some tests may be performed on the source code itself.
It is extremely important to ensure that all your source files each contain the Option Explicit line.
It forces you to declare variables before use, which not only makes you consider the scope and usage of each variable but also warns you of mistyped variable names that would otherwise auto-initialize to zero or the empty string. And is that variable name ending in 'l' really a mistyped '1'? Use of Option Explicit should help to prevent this kind of fault, but it's happened before, so it might happen to you.
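A source scanner can flag such look-alike names directly. The following Python fragment is a hypothetical illustration (the function and sample lines are invented, and it ignores VB's case-insensitivity for brevity): it normalizes easily confused characters and reports identifiers that then collide:

```python
import re

def find_lookalike_names(source_lines):
    """Flag identifier pairs that differ only by easily confused
    characters such as 'l' vs '1' or 'O' vs '0'."""
    trans = str.maketrans("lO", "10")  # map look-alikes to digits
    buckets = {}
    for line in source_lines:
        # Identifiers start with a letter, so numeric literals are skipped.
        for name in re.findall(r"[A-Za-z][A-Za-z0-9_]*", line):
            buckets.setdefault(name.translate(trans), set()).add(name)
    return [sorted(group) for group in buckets.values() if len(group) > 1]

code = [
    "Tota1 = Tota1 + 1",
    "Total = 0",
]
print(find_lookalike_names(code))  # [['Tota1', 'Total']]
```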
The DoEvents command allows other tasks to share the CPU within non-preemptive multitasking versions of Windows. All calls that take a substantial time to execute should have a DoEvents within or nearby.
If you open a file with a hardcoded I/O stream number, you run the risk that another routine within the program is already using this number (or vice versa), so always use the FreeFile function for file I/O.
Instead of doing

Open "myfile.txt" For Input As #1

do this instead:

Dim MyFileNumber As Integer
MyFileNumber = FreeFile
Open "myfile.txt" For Input As #MyFileNumber
Visual Basic 4.0 has Typelibs and predefined constants.
If you're using products such as Help Compilers that churn out huge files of global constants, you should beware of the problems this can cause with encapsulation of your code. There have also been many large Visual Basic 3.0 projects made lame by
global name or symbol table overflow.
Also note that the Global keyword has been replaced by Public in Visual Basic 4.0 (although Global has been retained for backward compatibility).
This is important for applications that may have to be used without a mouse. While most people prefer to use a mouse, you should not assume that machines (especially portables) will always have a mouse attached.
As mentioned earlier in the text, it is possible to allocate the same hot-key definition to more than one object. This is not a problem for objects that are mutually invisible, such as menus and objects within different tab dialogs.
Is there error handling within all Subs that define Form Events, preferably within all other Subs or Functions, too?
When a syntactically correct line of code causes an illegal operation at run-time (such as division by zero), the resulting error halts the interpreter if you are in design mode, or halts the program if you are running a compiled .EXE file.
This behavior can be altered by the addition of the On Error construct that enables graceful handling of error conditions without throwing the user out and losing his data. However, you should be careful when coding error handling that you don't hide
the existence of genuine coding errors from yourself by applying too much of a "blanket" approach. On Error Resume Next should always be viewed with the maximum doubt you can manage.
During development, on the other hand, errors should be amplified as much as possible so that each error causes an immediate break in the program. It is especially useful to wrap DLL calls within their own Visual Basic functions to provide a consistent approach to this error-prone interface, as well as to provide comprehensive return-code handling and trapping of returned error codes. Ideally, you should leave out (or comment out) all Visual Basic error-handling code during the development process, replacing it at the final stages of the build. Look in the MSBASIC forum on CompuServe for VBRIG by Brad Kaenel (72357,3523), which is an excellent program to "rig" your finished code with error-handlers. It has other uses for run-time analysis, as I mention later. It's shareware, so don't forget to GO SWREG.
See Chapter 5, "Error Trapping, Handling, and Reporting," for more details.
In general, if a sub or function is more than a couple of screensful long, you should consider breaking it down to smaller units for maintainability. Code not directly relevant to a .FRM should be moved to a .BAS module to allow faster form loading.
Excessively long lines should be broken down into smaller units that can be rapidly understood as individual concepts.
Recursion is when a function calls itself. It can be very useful if it is intentional.
If not intentional, it causes havoc by going on indefinitely. The usual scenario is a set of routines calling one of the others in turn and forming a logical loop, which results in your program going round and round until it runs out of stack and/or
memory.
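To illustrate the failure mode, here is a deliberately broken sketch in Python (the routine names are invented): two routines each call the other, forming exactly the sort of unintended logical loop described above, and the program runs until the stack is exhausted:

```python
def recalculate_total(n):
    # Updating the total triggers a display refresh...
    return refresh_display(n)

def refresh_display(n):
    # ...and refreshing the display recalculates the total,
    # closing an unintended logical loop.
    return recalculate_total(n + 1)

try:
    recalculate_total(0)
except RecursionError:
    print("stack exhausted")
```

Python at least raises a catchable error when its recursion limit is hit; a 16-bit Windows program would typically crash outright.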
It is possible to declare Dynamic Link Library functions that do not actually exist in the physical DLL. This often happens when the DLL is written by a team (C++ developers) other than the Visual Basic developers and the two developments go out of
sync.
As the functions are only looked for at run-time when required, this kind of omission can be missed in testing if the omitted function is only called in one or two obscure places. Be sure that the users will manage to find it, though.
Prevent these errors by using the EXEHDR tool (that comes with the Windows SDK or with VC++) to list the exported functions in all the DLLs specified in all the referenced declare lines in your code. If you find any functions that are declared but are
not in the EXEHDR output, investigate immediately.
See Chapter 20 for an example of how to check that the DLL is actually the correct one.
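The cross-check itself is simple once you have the export list. This Python sketch is an illustration only (the Declare lines, DLL name, and export set are invented, the EXEHDR parsing is omitted, and real Declare statements with Alias clauses would need extra handling):

```python
import re

def missing_dll_functions(declare_lines, exported_names):
    """Compare VB Declare statements against a DLL's export list
    (for example, as reported by EXEHDR) and return the names that
    are declared but not actually exported."""
    missing = []
    for line in declare_lines:
        m = re.search(r"Declare\s+(?:Function|Sub)\s+(\w+)", line)
        if m and m.group(1) not in exported_names:
            missing.append(m.group(1))
    return missing

declares = [
    'Declare Function GetTotals Lib "CALC.DLL" () As Integer',
    'Declare Sub ResetTotals Lib "CALC.DLL" ()',
]
exports = {"GetTotals"}
print(missing_dll_functions(declares, exports))  # ['ResetTotals']
```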
It's useful to avoid data conversions wherever possible because they consume processor time and may lose accuracy.
For example, assigning the contents of a Double (an 8-byte floating-point number) into a Single (a 4-byte floating-point number) and back again discards everything beyond roughly the first seven significant figures of the original number; whatever comes back in their place is effectively noise. Likewise, if the left-hand side of a line of code is a Long or an Integer, you run the risk of losing accuracy if the right-hand side involves floating-point variables or division, because the fractional part of the answer will be lost.
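You can see the damage with a quick experiment. Here Python's struct module stands in for the Single datatype, which is the same 4-byte IEEE floating-point format VB uses; the sample value is arbitrary:

```python
import struct

original = 0.123456789012345  # held in a Double (8-byte float)

# Round-trip through a Single (4-byte float), as when a Double is
# assigned to a Single variable and then copied back to a Double.
narrowed = struct.unpack("<f", struct.pack("<f", original))[0]

print(abs(narrowed - original) < 1e-7)   # True: the first ~7 digits survive
print(abs(narrowed - original) > 1e-10)  # True: the rest is gone
```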
There is a convention used among C programmers called Hungarian notation, in which variable names are prefixed with a few letters to signify the datatype and possibly its intended usage. Alternatively, you can use the old BASICA suffix convention of %, !, $, and so on; but this is often viewed as being a little ugly.
The point of this convention is that you can see if the datatypes match up at a glance.
If the cursor jumps around the screen at random whenever Tab is pressed, it will be more difficult for the user to input data to your application. You must decide for yourself what is the most logical, but going from top/left of the screen to
bottom/right is usually a good way to start.
Scattering explicitly numeric constants around your code is one of the best ways to make it unmaintainable and error prone. Anyone who wants to read your code will not have a clue what the constants are for.
If someone's task is to change a constant used in more than one place to another value, they might easily miss one of the occurrences. There is the further complication that two numeric entities might happen to have the same value, making it difficult to discern which is which in the code.
It is arguable that 0 and 1 in the context of For ptr = 1 To limit are valid exceptions to this rule.
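A scanner for explicit numeric constants needs little more than a regular expression. This Python fragment is a sketch only (the sample lines are invented, and a real tool would also skip string literals and comments):

```python
import re

def find_magic_numbers(source_lines, allowed=("0", "1")):
    """Report numeric literals other than an allowed few (0 and 1 here),
    together with the line each appears on."""
    hits = []
    for line in source_lines:
        # \b keeps digits embedded in identifiers (Var1) from matching.
        for number in re.findall(r"\b\d+(?:\.\d+)?\b", line):
            if number not in allowed:
                hits.append((number, line))
    return hits

code = [
    "For ptr = 1 To 100",
    "Timeout = 30",
]
print(find_magic_numbers(code))
# [('100', 'For ptr = 1 To 100'), ('30', 'Timeout = 30')]
```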
Spelling mistakes are one of the easiest ways to reduce the users' confidence in your testing procedures due to the mistakes' visibility. You should also ensure that your warning and error messages are comprehensible.
Write a small program to extract all the lines with double-quote marks, remove the characters before the first double-quote in each line and put the remainder through the spellchecker in Word or some other word processor. Don't forget to ensure that you
are also using the correct words even if the spellcheck passes. For example, look for "their" instead of "there" and vice versa.
Do this check before every formal release of your application.
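That extraction program is only a few lines. Here it is sketched in Python (the spellchecking itself is left to your word processor, and the sample lines are invented):

```python
def extract_message_text(source_lines):
    """Pull the text of string literals out of source lines so it can
    be fed to a spellchecker, following the recipe above: keep each
    line containing a double-quote, from the first quote onward."""
    extracted = []
    for line in source_lines:
        if '"' in line:
            extracted.append(line[line.index('"'):])
    return extracted

code = [
    'MsgBox "Plese wait while the file loads"',
    'x = x + 1',
]
print(extract_message_text(code))  # ['"Plese wait while the file loads"']
```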
This is a fault often made by C programmers who have moved on to Visual Basic. In Visual Basic, each variable has to be explicitly given a datatype, or else it takes the default datatype, which is Variant unless altered with a Def??? statement (DefInt, DefStr, and so on).
To make Var1 and Var2 both strings, use

Dim Var1 As String
Dim Var2 As String

Note that Dim Var1, Var2 As String does not do the same thing: only Var2 is a String, while Var1 is left as a Variant.
You should always try to combat Visual Basic 4.0's tendency to make anything it possibly can into a variant.
The ByVal keyword is used to send a throwaway copy of the variable's data to a function instead of the variable itself. Therefore, you should always use the ByVal keyword to pass data to a function unless you specifically want the function to be able to
modify that variable's data. Doing this should prevent corrupt data from "escaping" from the offending function.
There is an unusual use of the ByVal keyword in the context of passing variable-length strings to DLL functions. Because the C language uses a different internal representation of strings than Visual Basic 4.0, it is necessary to always use the ByVal
keyword to instruct Visual Basic 4.0 to translate between the formats when passing or receiving variable-length string data to or from a DLL.
Fixed length strings (either stand-alone or within user-defined types) do not share this use of the ByVal keyword.
Variables must be given a value before being used. If one of these operations is missing or the order of these operations is reversed, there is almost certainly an error.
It is bad practice to rely on BASIC's feature of auto-initializing variables on first use. You don't always know when that first use will be!
This is a minor optimization issue in that passed parameters must be expanded to WORD boundaries before use in a DLL.
If you arrange your user-defined types to have the 8-byte variables first (Double and Currency), followed by the 4-byte variables (Single and Long), followed by the 2-byte variable (Integer), followed by the remainder, you will naturally prealign most
of your UDT contents on WORD boundaries. See the VC++ documentation (if you have it) on the /Zp parameter for more information.
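The same principle is easy to demonstrate with Python's struct module, which follows the host C compiler's alignment rules (a modern analogy, not VB 4.0 itself, whose packing is described in the manuals):

```python
import struct

# A 2-byte integer placed before an 8-byte double forces the compiler
# to insert padding so the double can start on its natural boundary;
# ordering the members large-to-small avoids the waste.
mixed_first = struct.calcsize("@hd")  # short, then double
large_first = struct.calcsize("@dh")  # double, then short

print(mixed_first > large_first)  # True: small-first layout wastes space
```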
If your application cannot run perfectly alongside another instance of itself, you should check for use of App.PrevInstance in the start-up form and exit immediately if this is true.
If you include a variable length string within a user-defined type and save it to disk with a Put command, you save the string's current size and memory address instead of its contents. This isn't useful for very long.
Global variables are anathema to function encapsulation. If they are set in more than one place you increase your lack of program control by an order of magnitude.
If you are able to exit a routine via more than one place you will not be able to trace your program execution without a lot more care and attention. If you are stepping over functions, it won't always be obvious what was executed within the functions
and what wasn't.
Sometimes, lines of code can be moved outside a loop with no change to the results produced. For example, the Var2 line in the following fragment can be moved outside the For...Next loop, meaning that it is executed once instead of a hundred times, which aids run-time performance.

For ptr = 1 To 100
    Var1 = Var1 + ptr
    Var2 = 10
Next ptr
Also, it is much quicker to get the value of a variable than to get the value of an object property, so don't refer to heavily used constant property values explicitly. Assign the property to a temporary variable and use that instead. Be sure that the
property will remain constant before you do this.
For example, if you are aligning many screen objects dynamically with respect to the dimensions of the form itself, you can improve the speed by assigning the Form dimension properties to temporary variables and using the temporary variables instead.
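As a sketch (assuming a control array named lblField), the resize code might read:

```vb
Private Sub Form_Resize()
    Dim TempWidth As Single
    Dim ptr As Integer
    TempWidth = Me.ScaleWidth            ' read the property once...
    For ptr = 0 To lblField.Count - 1
        ' ...and use the cheap variable many times
        lblField(ptr).Width = TempWidth - lblField(ptr).Left
    Next ptr
End Sub
```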
Unless there is an agreed and pervasive local standard where single letter variables are appropriate, variable names should always be long enough to make their purpose clear (without being so long that they overflow the code window and raise the chances
of typos).
For example, actuaries might prefer to use a variable N(x) in their calculations instead of AnnuityCommutationFactorNumerator(CurrentAge), but if you're sensible you'll improve your code legibility.
A lot of DLL functions return integers to indicate whether the function executed correctly or not. If it didn't, the value of that integer indicates the nature of the failure. You should always enclose DLL calls with a Visual Basic wrapper function to
handle all the possibilities smoothly.
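For example, a wrapper around the 16-bit GetWindowsDirectory API might look like the following sketch (returning an empty string on failure is a local convention, not a requirement):

```vb
Declare Function GetWindowsDirectory Lib "Kernel" (ByVal lpBuffer As String, ByVal nSize As Integer) As Integer

Function WinDir() As String
    Dim Buffer As String
    Dim rc As Integer
    Buffer = Space$(255)
    rc = GetWindowsDirectory(Buffer, 255)
    If rc = 0 Or rc > 255 Then
        WinDir = ""                  ' empty string signals failure
    Else
        WinDir = Left$(Buffer, rc)   ' trim to the length actually returned
    End If
End Function
```

Callers then deal only with a well-behaved Visual Basic function, never with the raw return code.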
One of the best things about Visual Basic so far has been that it is almost impossible to cause a General Protection Fault if you stick to native Visual Basic code and Microsoft-packaged VBXs. If you get a GPF in Visual Basic, you can be sure some rogue DLL code is lurking somewhere. It is also much easier to watch the actual data flowing between Visual Basic and C++ if you limit the crossover to a single point instead of many.
People use your application's Help file when they need to know certain facts to enable them to carry on with their tasks, so it is usually very frustrating to be told that the Help topic does not exist or even given the wrong Help topic altogether. Read
the "Testing Help Files" chapter in the Help Compiler section of the Visual Basic 4.0 manuals for more information.
Help compilers usually generate large files of global constants to maintainably represent the set of HelpContextIDs implemented in the Help file. You should check that your application does not use any old constant names that have since ceased to have
any meaning within the .HLP file.
Read Appendix D (Specifications and Limitations) in the Visual Basic 4.0 Programmer's Guide manual. Are you likely to overflow any of the constraints mentioned there?
On Error GoTo is a genuinely useful command for making your programs more resilient.
Pure GoTo and GoSub, on the other hand, are BASICA backward-compatibility commands that have no sensible use in a Windows 95 world centered on encapsulated objects. They are the surest indicator of spaghetti code you can find.
The following code is a simple example of how a logically incomplete If...Then statement has led to a function that does something only for a very limited set of input possibilities. This should happen only if you have deliberately made that choice.
Function MyFunct(MyParam As Integer) As String
    If MyParam = 1 Then
        MyFunct = "One"
    ElseIf MyParam = 2 Then
        MyFunct = "Two"
    End If
End Function
You should always ensure all possibilities are catered for, usually by including a final Else statement before each End If to catch any default cases.
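One way to complete the example above is:

```vb
Function MyFunct(MyParam As Integer) As String
    If MyParam = 1 Then
        MyFunct = "One"
    ElseIf MyParam = 2 Then
        MyFunct = "Two"
    Else
        MyFunct = "Unknown"   ' the default case is now handled explicitly
    End If
End Function
```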
If you are using a special font that is not supplied as standard with Windows (for example, an unusual corporate standard font) you should ensure that the font exists on any target machine on which you install your application. This is especially
important because the error returned (Property Not Found) does not particularly indicate a missing font.
Here's a simple example of a project where the nature of Var1 is ambiguous without careful study of the code.
MODULE1.BAS
Global Var1 As String

FORM1.FRM
Private Sub Form_Load()
    Dim Var1 As Integer
End Sub
Use of a consistent variable naming notation (for example, Hungarian notation) helps to prevent this source of confusion.
If you find you're having to spend time with your fingers against the screen to work out what code is where within a forest of brackets, you should split the line into smaller units. This will help you debug a line because you can easily watch the
temporary variables' values.
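For example, a dense expression can be unpacked into steps you can watch individually (the names here are invented for illustration):

```vb
' Hard to watch in the debugger:
Result = Val(Mid$(txtInput.Text, InStr(txtInput.Text, ":") + 1))

' Easier to watch, one step at a time:
Dim RawText As String
Dim ColonPos As Integer
RawText = txtInput.Text
ColonPos = InStr(RawText, ":")
Result = Val(Mid$(RawText, ColonPos + 1))
```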
If a certain function is very heavily used in only one place in your code and performance needs to be improved in that area, you might try replacing the function call with the contents of the function—enclosed with comments to state what has
happened.
This saves the overhead of a (far) function call every time the inlined code is executed.
Since the first version of Visual Basic, toolbox objects have had default properties. For example, you may access the Text property of a textbox using just Text1 instead of Text1.Text.
The With...End With construct introduced with Excel 5.0 and Visual Basic 4.0 allows you to perform a similar operation with the leading qualifiers of a property value. See the Visual Basic 4.0 manual for examples.
Both of these can save some run-time interpretation overhead.
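For example, the following sketch sets several properties of the same textbox without repeating the qualifier:

```vb
With Text1
    .Text = "Ready"
    .FontBold = True
    .Enabled = True
End With
```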
These can make it visually difficult to match Ifs with End Ifs, particularly if line indentation has been disturbed or not implemented carefully in the first place.
If your code is being run on a machine without floating-point calculation hardware, you can often gain a significant speed increase by maximizing your use of integers: either the Integer, Long, or Currency datatypes. Beware of the different number of
bytes that may be used for an integer—2 bytes in 16-bit systems and 4 bytes in 32-bit systems!
You can allow integers to be used for control positions by ensuring that ScaleMode is set to "Twips" or "Pixels."
Programmers who have not spent much time programming for Windows NT applications are often unaware that it is dangerous to assume that a character always occupies one byte. In the UNICODE text convention used within Windows NT, each character occupies
two bytes, to enable more than 256 international characters to be described.
There are many articles on this subject in recent issues of the Microsoft Systems Journal as well as within the MS Knowledge Base.
Here are some examples from the MS-KB library:
Q89295   Unicode Conversion to Integers
Q99884   Unicode and Microsoft Windows NT
Q100639  Unicode Support in the MS Foundation Class Library
Q103977  Unicode Implementation in Windows NT 3.1 and 3.5
Q109199  INF: Using Double-byte Character Sets with SQL Server
Q130052  Tips for Converting from ASCII or ANSI to Unicode
The US convention for abbreviated dates is MM/DD/YY, whereas the European convention is DD/MM/YY. If you want your program to work sensibly both in the United States and abroad for more than 12 days each year, you should consider using "mmm" as your format string within Format$() instead of "mm."
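A quick sketch of the difference:

```vb
' Ambiguous for the first 12 days of each month:
Debug.Print Format$(Now, "mm/dd/yy")
' Unambiguous on both sides of the Atlantic:
Debug.Print Format$(Now, "dd mmm yy")   ' something like 25 Dec 95
```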
This is equally important for documentation, where style and meaning of phrases can be subtly different between countries. You might use UK English spelling if you are producing a specific version of your program for use there (that is,
"colour" instead of "color"). Likewise, any text that refers to "Invoking the Serialization Facility" will cause great mirth among any readers who prefer to "Save the file."
See Chapter 28 ("International Issues") in the Programmer's Guide manual for more Visual Basic-specific considerations like this.
Some controls map directly to Windows control classes and so occupy resources when used. Some are internal to Visual Basic and don't. If resources are tight, you should try to use the Visual Basic ones. For example, use a label instead of a read-only
textbox. Likewise the Image control is a lightweight version of the Picture control.
If you have too many controls on the same form, it makes the form difficult to use. If you employ dynamic positioning of controls to match form resizing, you run the risk of controls overlapping.
If you have many command buttons, you might consider the use of a Buttonbar control.
Generally, if a control is never visible or enabled it should be removed from the project to reduce complication and Windows resource requirements. Sometimes, invisible controls may serve a useful function. For example, invisible listboxes may be used
for quick-and-dirty, no-code, auto-sorting storage for small amounts of text data.
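For example (assuming an invisible listbox named lstSort with its Sorted property set to True at design time):

```vb
lstSort.AddItem "Pear"
lstSort.AddItem "Apple"
lstSort.AddItem "Mango"
' The items can now be read back from lstSort.List(0) onwards
' in alphabetical order, with no sorting code written.
```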
Use of this Form, Frame, or Picturebox property can substantially increase user interface speed by repainting only the screen image of the object that has changed instead of the whole form. Use with care, though, because lightweight controls such as
Labels and Images will be painted over if they ever overlap with any Windows graphical controls.
Don't scatter related functions over many different code modules. It makes it difficult to keep track of them, particularly if they are designed to implement a replaceable programming interface.
If you've got many different systems to interface to, it's much easier to swap different complete modules in and out than to hunt around for sets of individual functions.
If your application does a lot of work at start-up, it's a good idea to get something on the screen immediately to reassure the user that something is happening. If you don't have a splash screen such as a bitmap of your corporate logo to show, put a
Me.Show statement as soon as possible in the Form_Load() function.
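A minimal sketch:

```vb
Private Sub Form_Load()
    Me.Show      ' get the form painted before the slow work begins
    Me.Refresh
    ' ...lengthy initialization follows...
End Sub
```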
If you still need to support DOS versions of your software, you will have to accommodate the mismatch in available commands between the Windows versions and the DOS version of Visual Basic (which has not been updated since 1992).
For example, many commands such as PRINT USING are missing from Windows versions while the Access database functionality is missing from the DOS version.
If you read the "Code Limitations" section in the "Specifications and Limitations" section in Appendix D of the Programmer's Guide manual, you will see that Visual Basic uses internal 64K tables that hold dynamic information used at
run-time. The 16-bit version has tighter limitations than the 32-bit version, so be careful if you intend to reverse-migrate from 32-bit to 16-bit.
It's worth checking to make sure large projects don't push these limits. Common causes of this include large include files of DLL declarations supplied with third-party libraries or files of global constants generated by Help compilers. Look in the
MSBASIC forum on CompuServe for VBSPAC.ZIP, which is an excellent program to analyze your code for potential overflow of these tables. Check for a Visual Basic 4.0 version (as Visual Basic 3.0 did not have a separate internal table for DLL declarations).
Some tests have to be done on the program while it is running.
If your application needs to run alongside other applications such as MS Office, you should do all you can to minimize its Windows resources, because Win3.1 systems (which limit you to 64K for each of their GDI and Usr tables) will still be used for a
long time yet.
Verify that the amounts of free GDI and Usr resources are the same on application exit as when it started; if not, some of your components have not been correctly unloaded.
You can even take these checks down to function level if required. There's a Windows API function (GetFreeSystemResources) that you can call at appropriate times, writing the results to a file in a RAM disk so that speed doesn't suffer too much.
Declare Function GetFreeSystemResources Lib "User" (ByVal SysResType As Integer) As Integer

For GDI, call GetFreeSystemResources(1); for Usr, call GetFreeSystemResources(2).
If you want to go for this in a big way, download VBRIG (an error-handler code rigger by Brad Kaenel, CompuServe ID 72357,3523) from the MSBASIC forum on CompuServe. (It's shareware, so don't forget to GO SWREG!)
Use VBRIG to bulk-add function calls to the entry and exit points of each sub or function in the project. You can then replace his VBRIGSTD.BAS module with your own 10-line one, consisting of a call to the GetFreeSystemResources() API function followed
by an append of the data to a RAM disk file. You should Open and Close the file for each Append to ensure the RAM disk file is complete should your program crash.
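Such a replacement module might be sketched as follows (the drive letter, file name, and routine name are assumptions; adjust them to match your RAM disk and VBRIG's calling convention):

```vb
Declare Function GetFreeSystemResources Lib "User" (ByVal SysResType As Integer) As Integer

Sub LogResources(ByVal RoutineName As String)
    Dim hFile As Integer
    hFile = FreeFile
    Open "D:\RIGLOG.TXT" For Append As #hFile
    Print #hFile, RoutineName; ","; GetFreeSystemResources(1); ","; GetFreeSystemResources(2)
    Close #hFile     ' Open/Close each time so the file survives a crash
End Sub
```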
Note that VBRIG was deliberately coded to be reversible. It does not rig any single-line If statements used to exit functions (because this would mean altering the line itself), so you must split lines like

If {condition} Then Exit Function

into

If {condition} Then
    Exit Function
End If

otherwise, these function exit points will be missed.
Another thing you can use a VBRIGed project for is to write the time difference between sub exits and entries to your RAM disk file. You can then perhaps home in on the slow points of the program. This is of restricted use unless the application is
being run by some kind of regression test/automation tool to give completely reproducible timings.
Be careful about the results of calling small functions many times because the overhead of doing this check can take a lot longer to execute than the function contents. Remove the rigging from such functions and try again.
You can also derive the number of times each function is called. The results may surprise you if there are timer functions with intervals set too low, for example. These results are also useful for regression test purposes in that you can derive a list
of the routines in your system that are not exercised by your test scripts and create new scripts to address the omissions.
Try printing the names of currently loaded forms in Form_Activate() events placed in every form. You might prefer to print to a file instead of the Debug object.
Private Sub Form_Activate()
    Dim ptr As Integer
    For ptr = 0 To Forms.Count - 1
        Debug.Print Forms(ptr).Caption
    Next ptr
End Sub
This can be important from a security point of view: if you allow users to drop back only as far as the application's password screen, you want to be sure that forms containing the previous user's data do not remain in memory.
Modal forms demand that the user interact with them before allowing further progress. They are useful for handling serious errors, but you should almost always use Application Modal, which prevents further interaction with your application, rather than System Modal, which prevents any interaction with the whole of Windows.
You should try at least one color scheme different from the Windows default. You should also ensure that any Windows 95 or Plug-and-Play features, such as altering the screen resolution while your application is running, do not upset its operation.
Always remember that prevention is better than cure. You should try to design your program to be testable. Test the components of your program at the earliest opportunity, before the effects of any errors have time to propagate. Read the Visual Basic 4.0 manuals and a good book on software development theory before you start coding.